# 65k Long Context
Mixtral 8x22B V0.1 GGUF
Apache-2.0
Mixtral 8x22B is a sparse mixture-of-experts model released by Mistral AI, with 141 billion total parameters (roughly 39 billion active per token across its 8 experts), supporting multilingual text generation tasks.
Large Language Model
Supports Multiple Languages
Uploader: MaziyarPanahi · Downloads: 170.27k · Likes: 74
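As a rough illustration (not part of this listing), a quantized GGUF build like this one is typically run with a GGUF-compatible runtime such as llama-cpp-python. The sketch below only assembles the loader arguments; the filename, quantization level, and GPU offload setting are assumptions for illustration, not values taken from this page:

```python
# Sketch only: assumes llama-cpp-python is installed and a GGUF shard has
# already been downloaded locally. The filename used in the usage example
# below is hypothetical; large MoE GGUF releases are usually split into
# several shard files.

def mixtral_load_config(model_path: str) -> dict:
    """Keyword arguments one might pass to llama_cpp.Llama for this model."""
    return {
        "model_path": model_path,
        "n_ctx": 65536,      # the 65k-token context window this page advertises
        "n_gpu_layers": -1,  # offload all layers to GPU when one is available
    }

# Usage would then look roughly like:
# from llama_cpp import Llama
# llm = Llama(**mixtral_load_config("mixtral-8x22b-v0.1.Q4_K_M.gguf"))
# print(llm("Translate to French: Hello, world.", max_tokens=32))
```

Loading the full model requires substantial RAM or VRAM even at aggressive quantization levels, so the GPU offload value is best tuned to the hardware at hand.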
© 2025 AIbase